These products have sturdy halftone dots, accurate overprinting, good tone reproduction, and no ghosting or streaks.
这些产品网点结实、套印准、还原好、无重影、无条痕。
Rather than using the dot product equation, let's use the electric flux equation without the dot product.
我们不使用点积方程,而是使用不带点积的电通量方程。
Electric flux equals the integral of the dot product of electric field and dA.
电通量等于电场与 dA 的点积的积分。
However, let's use E dA cosine theta instead of the dot product.
但是,让我们使用 E dA cosθ 来代替点积。
Electric flux equals the dot product of the electric field and the area, so let's use that.
电通量等于电场与面积的点积,所以我们就使用它。
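A minimal pure-Python sketch of the two equivalent flux formulas mentioned above (the field and area vectors here are made-up illustrative values): the dot product E · A gives the same number as |E| |A| cos θ.

```python
import math

def dot(u, v):
    """Dot product of two 2D vectors."""
    return sum(a * b for a, b in zip(u, v))

# Made-up uniform field and area vectors for a flat surface.
E = (5.0, 0.0)
A = (2.0, 2.0)

flux_dot = dot(E, A)  # Phi = E . A

# The same flux from magnitudes and the angle between the vectors.
E_mag = math.hypot(*E)
A_mag = math.hypot(*A)
theta = math.atan2(A[1], A[0]) - math.atan2(E[1], E[0])
flux_cos = E_mag * A_mag * math.cos(theta)

print(abs(flux_dot - flux_cos) < 1e-12)  # True: the two forms agree
```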
Solving a linear system with an orthonormal matrix is actually super easy, because dot products are preserved.
用标准正交矩阵求解线性方程组其实非常简单,因为点积被保留了。
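A small pure-Python sketch of the preservation claim above (the rotation angle and vectors are arbitrary choices): applying an orthonormal transformation, here a rotation, to both vectors leaves their dot product unchanged.

```python
import math

def dot(u, v):
    """Dot product of two 2D vectors."""
    return u[0] * v[0] + u[1] * v[1]

def apply(m, v):
    """Multiply a 2x2 matrix (row-major tuples) by a 2D vector."""
    return (m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1])

# A rotation by 30 degrees: an orthonormal transformation.
t = math.radians(30)
rot = ((math.cos(t), -math.sin(t)),
       (math.sin(t),  math.cos(t)))

u, v = (3.0, 1.0), (-1.0, 2.0)
before = dot(u, v)
after = dot(apply(rot, u), apply(rot, v))
print(abs(before - after) < 1e-12)  # True: the dot product is preserved
```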
That is, they don't preserve that zero dot product.
也就是说,它们不会保留那个为零的点积。
The dot product before and after the transformation will look very different.
变换前后的点积看起来会很不一样。
And looking at the example I just showed, dot products certainly aren't preserved.
看看我刚才举的例子,点积肯定没有被保留。
The relevant background here is understanding determinants, a little bit about dot products, and, of course, linear systems of equations.
相关的背景知识是理解行列式、一点点积知识,当然还有线性方程组。
In the sense that applying the transformation is the same thing as taking a dot product with that vector.
在这个意义上,应用这个变换就相当于与这个向量做点积。
In fact, a worthwhile side note here: transformations which do preserve dot products are special enough to have their own name.
事实上,值得注意的是,保留点积的变换足够特别,以至于有自己的名字。
And if they point in generally the opposite direction, their dot product is negative.
如果它们大体指向相反的方向,它们的点积就是负的。
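A tiny pure-Python illustration of the sign behavior described above (the vectors are made-up values): roughly opposite directions give a negative dot product, roughly the same direction a positive one.

```python
def dot(u, v):
    """Dot product of two vectors of equal length."""
    return sum(a * b for a, b in zip(u, v))

# Pointing in generally opposite directions: negative dot product.
print(dot((2, 1), (-3, -1)))  # -7
# Pointing in generally the same direction: positive dot product.
print(dot((2, 1), (3, 2)))    # 8
```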
The x-coordinate of this mystery input vector is what you get by taking its dot product with the first basis vector, (1, 0).
这个神秘的输入向量的横坐标就是它和第一个基向量 (1, 0) 的点积。
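A minimal pure-Python sketch of this idea (the input vector's values are made up): dotting with the standard basis vectors reads off a vector's coordinates.

```python
def dot(u, v):
    """Dot product of two vectors of equal length."""
    return sum(a * b for a, b in zip(u, v))

v = (4, 7)          # a "mystery" input vector (made-up values)
x = dot(v, (1, 0))  # dot with the first basis vector: the x-coordinate
y = dot(v, (0, 1))  # dot with the second basis vector: the y-coordinate
print(x, y)         # 4 7
```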
Likewise, things that start off perpendicular, with dot product zero, like the two basis vectors, quite often don't stay perpendicular to each other after the transformation.
同样地,一开始相互垂直、点积为零的东西,比如两个基向量,在变换之后往往不会继续相互垂直。
When they're perpendicular, meaning the projection of one onto the other is the zero vector, their dot product is zero.
当它们垂直时,也就是一个向量在另一个向量上的投影是零向量时,它们的点积为零。
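A small pure-Python sketch of the perpendicular case above (the two vectors are arbitrary perpendicular choices): the projection of one onto the other comes out as the zero vector, and the dot product is zero.

```python
def dot(u, v):
    """Dot product of two vectors of equal length."""
    return sum(a * b for a, b in zip(u, v))

def project(u, onto):
    """Projection of u onto `onto` (assumes `onto` is nonzero)."""
    scale = dot(u, onto) / dot(onto, onto)
    return tuple(scale * c for c in onto)

u, v = (2, 1), (-1, 2)  # perpendicular vectors
print(dot(u, v))        # 0
print(project(u, v))    # the zero vector
```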
And that's probably the most important thing for you to remember about the dot product.
这可能是你们要记住的关于点积最重要的一点。
Looking at sides 2 and 4, we need to realize that the dot product of B and ds is the same as B ds cosine theta, right?
观察边 2 和边 4,我们需要意识到 B 和 ds 的点积与 B ds cosθ 相同,对吧!
So that wraps up dot products and cross products.
点积和叉乘就讲到这里。
So that performing the linear transformation is the same as taking a dot product with that vector, namely the cross product.
所以进行线性变换就等于与这个向量,也就是叉乘的结果,做点积。
Now, this numerical operation of multiplying a one by two matrix by a vector feels just like taking the dot product of two vectors.
现在,这个 1×2 矩阵乘以一个向量的数值运算,感觉就像是取两个向量的点积。
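A tiny pure-Python sketch of that correspondence (the matrix entries and vector are made-up values): multiplying a 1×2 matrix by a vector is numerically the same operation as dotting two vectors.

```python
def dot(u, v):
    """Dot product of two vectors of equal length."""
    return sum(a * b for a, b in zip(u, v))

def matvec_1x2(m, v):
    """Multiply a 1x2 matrix [[a, b]] by a 2D column vector."""
    return m[0][0] * v[0] + m[0][1] * v[1]

m = [[3, -2]]  # a 1x2 matrix (made-up entries)
v = (4, 5)
print(matvec_1x2(m, v))  # 2
print(dot((3, -2), v))   # 2: the same number, the same operation
```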